L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs

Authors

  • Matey Neykov
  • Jun S. Liu
  • Tianxi Cai
Abstract

It is known that for a certain class of single index models (SIMs) Y = f(X⊤β₀, ε), support recovery is impossible when X ~ 𝒩(0, 𝕀p×p) and a model-complexity-adjusted sample size is below a critical threshold. Recently, optimal algorithms based on Sliced Inverse Regression (SIR) were suggested. These algorithms work provably under the assumption that the design X comes from an i.i.d. Gaussian distribution. In the present paper we analyze algorithms based on covariance screening and least squares with L1 penalization (i.e., the LASSO) and demonstrate that they can also enjoy optimal (up to a scalar) rescaled sample size in terms of support recovery, albeit under slightly different assumptions on f and ε compared to the SIR-based algorithms. Furthermore, we show more generally that the LASSO succeeds in recovering the signed support of β₀ if X ~ 𝒩(0, Σ) and the covariance Σ satisfies the irrepresentable condition. Our work extends existing results on the support recovery of the LASSO for the linear model to a more general class of SIMs.
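As a rough illustration of the kind of procedure analyzed here (not the authors' exact algorithm or tuning), the sketch below generates data from a SIM Y = f(X⊤β₀, ε) with a standard Gaussian design, fits L1-penalized least squares with a penalty on the usual √(log p / n) scale, and reads off the signed support of the estimate. The choice of f, the constants, and the penalty level are assumptions made for illustration.

```python
# A minimal sketch (not the paper's exact procedure): fit the LASSO to
# data from a single index model Y = f(X'beta0, eps) with a standard
# Gaussian design, then read off the signed support of the estimate.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 500, 1000, 5                       # samples, dimension, sparsity

beta0 = np.zeros(p)
beta0[:s] = rng.choice([-1.0, 1.0], size=s)  # signed support to recover

X = rng.standard_normal((n, p))              # X ~ N(0, I_{p x p})
u = X @ beta0
Y = 2 * u + np.cos(u) + 0.5 * rng.standard_normal(n)  # an illustrative f

# Penalty on the usual sqrt(log p / n) scale (constant chosen ad hoc).
lam = 2 * np.sqrt(np.log(p) / n)
beta_hat = Lasso(alpha=lam).fit(X, Y).coef_

support_ok = np.array_equal(beta_hat != 0, beta0 != 0)
signs_ok = np.all(np.sign(beta_hat[:s]) == np.sign(beta0[:s]))
print("signed support recovered:", support_ok and bool(signs_ok))
```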


Similar resources

Simultaneous support recovery in high dimensions: Benefits and perils of block l1/l∞-regularization

Given a collection of r ≥ 2 linear regression problems in p dimensions, suppose that the regression coefficients share partially common supports of size at most s. This set-up suggests the use of l1/l∞-regularized regression for joint estimation of the p × r matrix of regression coefficients. We analyze the high-dimensional scaling of l1/l∞-regularized quadratic programming, considering both co...
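Since the l1/l∞ estimator is easy to state as a convex program, here is a minimal sketch using cvxpy. The problem sizes, λ, and the data generation are illustrative assumptions, not the scaling regime analyzed in that paper.

```python
# A minimal sketch of l1/l∞-regularized joint regression via cvxpy
# (illustrative formulation; sizes and lambda chosen ad hoc).
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 100, 50, 3                       # r regression problems in p dims
X = rng.standard_normal((n, p))
B_true = np.zeros((p, r))
B_true[:5] = rng.standard_normal((5, r))   # rows share a common support
Y = X @ B_true + 0.1 * rng.standard_normal((n, r))

B = cp.Variable((p, r))
lam = 0.1
# l1/l∞ penalty: sum over rows of the largest absolute entry in the row.
penalty = cp.sum(cp.max(cp.abs(B), axis=1))
objective = cp.Minimize(cp.sum_squares(Y - X @ B) / (2 * n) + lam * penalty)
cp.Problem(objective).solve()

print("estimated support rows:",
      np.flatnonzero(np.linalg.norm(B.value, axis=1) > 1e-4))
```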


Block Regularized Lasso for Multivariate Multi-Response Linear Regression

The multivariate multi-response (MVMR) linear regression problem is investigated, in which the design matrices are Gaussian with covariance matrices Σ⁽¹⁾, . . . , Σ⁽ᴷ⁾ for the K linear regressions. The support union of the K p-dimensional regression vectors (collected as columns of the matrix B∗) is recovered using the l1/l2-regularized Lasso. Sufficient and necessary conditions to guarantee successful recovery ...
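scikit-learn's MultiTaskLasso implements an l1/l2 (row-wise group) penalty of this kind, so a minimal sketch is possible with a standard library call. Note one simplification relative to the MVMR setting above: MultiTaskLasso assumes a single design matrix shared across all K responses.

```python
# A minimal sketch of l1/l2-regularized Lasso for multi-response
# regression. Unlike the MVMR setting above, this assumes one design
# matrix shared by all K responses; sizes and alpha are illustrative.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, K = 200, 100, 4
X = rng.standard_normal((n, p))
B_star = np.zeros((p, K))
B_star[:6] = rng.standard_normal((6, K))   # common support union
Y = X @ B_star + 0.1 * rng.standard_normal((n, K))

# Penalty: sum over features j of the l2 norm of the j-th coefficient row.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
support = np.flatnonzero(np.linalg.norm(model.coef_, axis=0) > 1e-8)
print("recovered support union:", support)
```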


A Unified Approach to Model Selection and Sparse Recovery Using Regularized Least Squares

Model selection and sparse recovery are two important problems for which many regularization methods have been proposed. We study the properties of regularization methods in both problems under the unified framework of regularized least squares with concave penalties. For model selection, we establish conditions under which a regularized least squares estimator enjoys a nonasymptotic property, ...
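One common way to fit regularized least squares with a concave penalty is the local linear approximation, which reduces the problem to iteratively reweighted LASSO fits. The sketch below uses the SCAD penalty; the weighting trick, tuning constants, and demo data are illustrative assumptions, not the estimator studied in that paper.

```python
# A rough sketch of concave-penalized least squares (SCAD) via the
# local linear approximation: repeated weighted LASSO fits, where the
# weighted problem is solved by rescaling the columns of X.
import numpy as np
from sklearn.linear_model import Lasso

def scad_deriv(t, lam, a=3.7):
    """Derivative of the SCAD penalty at |t| (used as LASSO weights)."""
    t = np.abs(t)
    return lam * np.where(t <= lam, 1.0,
                          np.maximum(a * lam - t, 0.0) / ((a - 1) * lam))

def scad_ls(X, y, lam, n_iter=5):
    beta = Lasso(alpha=lam, max_iter=5000).fit(X, y).coef_  # initializer
    for _ in range(n_iter):
        w = scad_deriv(beta, lam) / lam          # relative weights in [0, 1]
        w = np.maximum(w, 1e-3)                  # keep the rescaling finite
        gamma = Lasso(alpha=lam, max_iter=5000).fit(X / w, y).coef_
        beta = gamma / w                         # undo the column rescaling
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
beta = np.zeros(50); beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.3 * rng.standard_normal(200)
print(np.round(scad_ls(X, y, lam=0.1)[:5], 2))
```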


Retrieving Three Dimensional Displacements of InSAR Through Regularized Least Squares Variance Component Estimation

Measuring 3D displacement fields provides essential information regarding the interaction of the Earth's crust and the rheology of the mantle. Interferometric synthetic aperture radar (InSAR) has an appropriate capability for revealing displacements of the Earth's crust; however, it measures only the projection of the real 3D displacement onto the line of sight (LOS) direction. The 3D displacement vectors can be retrieved ...
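The core idea, projecting an unknown 3D displacement onto several LOS directions and inverting the resulting system by regularized least squares, can be sketched as follows. The viewing geometry, noise level, and Tikhonov regularization here are illustrative, and the paper's variance component estimation step is omitted.

```python
# A bare-bones sketch: recover a 3D displacement vector from several
# InSAR line-of-sight (LOS) measurements by regularized least squares.
# Geometry and noise are illustrative; variance component estimation
# from the paper above is omitted.
import numpy as np

d_true = np.array([0.02, -0.01, 0.05])      # east, north, up (meters)

# Unit LOS vectors from three viewing geometries (illustrative values).
A = np.array([[ 0.38, -0.09, 0.92],
              [-0.41, -0.11, 0.90],
              [ 0.62,  0.10, 0.78]])
A /= np.linalg.norm(A, axis=1, keepdims=True)

los = A @ d_true + 1e-3 * np.random.default_rng(0).standard_normal(3)

# Tikhonov-regularized LS: (A'A + k I) d = A' los, small k for stability.
k = 1e-4
d_hat = np.linalg.solve(A.T @ A + k * np.eye(3), A.T @ los)
print("estimated displacement:", d_hat)
```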


Estimation with Norm Regularization

Analysis of non-asymptotic estimation error and structured statistical recovery based on norm-regularized regression, such as the Lasso, needs to consider four aspects: the norm, the loss function, the design matrix, and the noise model. This paper generalizes such estimation error analysis along all four aspects relative to the existing literature. We characterize the restricted error...



Journal:
  • Journal of Machine Learning Research (JMLR)

Volume 17, Issue 1

Pages: –

Publication date: 2016